Evolution in Groups: A deeper look at synaptic cluster driven evolution of deep neural networks
Authors
Abstract
A promising paradigm for achieving highly efficient deep neural networks is the idea of evolutionary deep intelligence, which mimics biological evolution processes to progressively synthesize more efficient networks. A crucial design factor in evolutionary deep intelligence is the genetic encoding scheme used to simulate heredity and determine the architectures of offspring networks. In this study, we take a deeper look at the notion of synaptic cluster-driven evolution of deep neural networks, which guides the evolution process towards the formation of a highly sparse set of synaptic clusters in offspring networks. Utilizing a synaptic cluster-driven genetic encoding, the probabilistic encoding of synaptic traits considers not only individual synaptic properties but also inter-synaptic relationships within a deep neural network. This process results in highly sparse offspring networks which are particularly tailored for parallel computational devices such as GPUs and deep neural network accelerator chips. Comprehensive experimental results using four well-known deep neural network architectures (LeNet-5, AlexNet, ResNet-56, and DetectNet) on two different tasks (object categorization and object detection) demonstrate the efficiency of the proposed method. The cluster-driven genetic encoding scheme synthesizes networks that achieve state-of-the-art performance with a significantly smaller number of synapses than the original ancestor network (∼125-fold decrease in synapses for MNIST). Furthermore, the improved cluster efficiency in the generated offspring networks (∼9.71-fold decrease in clusters for MNIST and an ∼8.16-fold decrease in clusters for KITTI) is particularly useful for accelerated performance on parallel computing hardware architectures such as those in GPUs and deep neural network accelerator chips.
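The cluster-driven encoding described in the abstract can be pictured with a minimal sketch: each cluster of synapses is inherited with a probability driven by its aggregate strength, and synapses inside an inherited cluster are then kept with probabilities driven by their individual magnitudes, yielding a sparse offspring connectivity mask. The function name, the tanh-based probability mapping, and the alpha/beta scaling parameters below are illustrative assumptions for this sketch, not the paper's actual formulation.

```python
import numpy as np

def synthesize_offspring(weights, cluster_ids, alpha=0.5, beta=0.5, rng=None):
    """Sketch of cluster-driven heredity: a cluster survives with a probability
    driven by its aggregate synaptic strength; synapses inside a surviving
    cluster are then kept with probabilities driven by their own magnitudes."""
    rng = np.random.default_rng() if rng is None else rng
    offspring_mask = np.zeros_like(weights, dtype=bool)
    for c in np.unique(cluster_ids):
        in_cluster = cluster_ids == c
        # Cluster-level trait: mean synaptic magnitude mapped to a probability
        # (tanh mapping is an assumption made for this illustration).
        p_cluster = np.tanh(alpha * np.mean(np.abs(weights[in_cluster])))
        if rng.random() < p_cluster:
            # Synapse-level trait: stronger synapses are more likely inherited.
            p_syn = np.tanh(beta * np.abs(weights[in_cluster]))
            offspring_mask[in_cluster] = rng.random(p_syn.shape) < p_syn
    return offspring_mask  # sparse connectivity mask for the offspring network
```

Because entire clusters are dropped together, the resulting sparsity is structured rather than scattered, which is what makes the offspring networks well suited to parallel hardware such as GPUs and deep neural network accelerator chips.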
Similar Resources
Evolutionary Synthesis of Deep Neural Networks via Synaptic Cluster-driven Genetic Encoding
There has been significant recent interest in achieving highly efficient deep neural network architectures that preserve strong modeling capabilities. A particularly promising paradigm for achieving such deep neural networks is the concept of evolutionary deep intelligence, which attempts to mimic biological evolution processes to synthesize highly efficient deep neural networks over success...
Spike timing dependent plasticity: mechanisms, significance, and controversies
Long-term modification of synaptic strength is one of the basic mechanisms of memory formation and activity-dependent refinement of neural circuits. This idea was proposed by Hebb to provide a basis for the formation of a cell assembly. Repetitive correlated activity of pre-synaptic and post-synaptic neurons can induce long-lasting synaptic strength modification, the direction and extent of whi...
Image Backlight Compensation Using Recurrent Functional Neural Fuzzy Networks Based on Modified Differential Evolution
In this study, an image backlight compensation method using adaptive luminance modification is proposed for efficiently obtaining clear images. The proposed method combines the fuzzy C-means clustering method, a recurrent functional neural fuzzy network (RFNFN), and a modified differential evolution. The proposed RFNFN is based on the two backlight factors that can accurately detect the compensat...
A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network
Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). The absence of an appropriate structure, convergence to local optima, and low learning speed are deficiencies of FWNNs in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address the aforementioned learning deficiencies. Differential Evolution...
Journal: CoRR
Volume: abs/1704.02081
Publication date: 2017